LLM 25-Day Course

Python · 25-day course · starts from Beginner

25 days total · 5 Beginner · 13 Intermediate · 7 Advanced

Day 1: Essential AI Terminology

Beginner

A comprehensive overview of key terms, from the differences between AI, ML, and DL to parameters, epochs, and loss functions.

Day 2: NLP Fundamentals and Terminology

Beginner

Key NLP terms, including tokens, corpus, embeddings, and attention, explained with examples.

Day 3: Understanding Tokenization and Embeddings

Beginner

Understand the BPE and WordPiece algorithms and Word2Vec concepts, with hands-on practice using tiktoken.
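
The core BPE idea covered on this day can be sketched without any library: repeatedly merge the most frequent adjacent symbol pair. This is a toy illustration in plain Python, not tiktoken's actual implementation:

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count adjacent symbol pairs and return the most common one.
    return Counter(zip(tokens, tokens[1:])).most_common(1)[0][0]

def merge_pair(tokens, pair):
    # Replace every occurrence of `pair` with a single merged symbol.
    merged, i = [], 0
    while i < len(tokens):
        if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
            merged.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

# Start from individual characters and apply a few merges.
tokens = list("low lower lowest")
for _ in range(3):
    tokens = merge_pair(tokens, most_frequent_pair(tokens))
```

After a few merges, frequent character sequences like "low" become single tokens, which is exactly how BPE grows its vocabulary during training.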

Day 4: Complete Guide to Transformer Architecture

Intermediate

Explore the Transformer's encoder/decoder structure, self-attention, the feed-forward network (FFN), and normalization, with diagrams.

Day 5: Deep Dive into Attention Mechanisms

Intermediate

Implement Q/K/V, scaled dot-product attention, multi-head attention, and positional encoding from scratch with NumPy.
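
The centerpiece of this day's exercise, scaled dot-product attention, fits in a few lines of NumPy (the shapes below are arbitrary example sizes):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))   # one value vector per key
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of `w` sums to 1, so the output is a weighted average of the value vectors; multi-head attention simply runs several of these in parallel over split projections.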

Day 6: What Are LLMs? The Principles of Large Language Models

Intermediate

Understand the essence of LLMs, from scaling laws and pre-training to parameter-scale characteristics and emergent abilities.

Day 7: OpenAI GPT Series

Intermediate

Model selection guide for the latest OpenAI lineup, Responses API usage, and cost management tips.

Day 8: Claude Series (Anthropic)

Intermediate

Explore selection criteria for the Claude model lineup, Constitutional AI concepts, and Anthropic Messages API usage.

Day 9: Meta Llama Series

Intermediate

Understand the evolution of the Llama family and run the latest open models locally.

Day 10: Open-Source LLM Ecosystem Overview

Intermediate

A comprehensive overview of open-source LLMs including Qwen, DeepSeek, Phi, Yi, and the Korean-language models SOLAR and EXAONE.

Day 11: Multimodal Models

Advanced

Learn how to process images and text together using commercial and open-source multimodal model families.

Day 12: Introduction to Hugging Face Ecosystem

Beginner

A comprehensive look at the Hugging Face ecosystem including Hub, Spaces, Transformers, Datasets, and PEFT.

Day 13: Getting Started with Transformers Library

Beginner

From installation to hands-on experience with pipeline() for text classification, sentiment analysis, translation, and summarization.

Day 14: Practical Text Generation

Intermediate

Understand the key parameters of model.generate() and compare generation strategies including temperature, top_p, and top_k through hands-on practice.
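
The sampling strategies compared on this day can be reproduced without a model. This library-free sketch implements the filtering steps behind typical `generate()` settings (the function name and toy logits are illustrative, not a real API):

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=0, top_p=1.0, rng=None):
    # Temperature scaling, then top-k, then top-p (nucleus) filtering.
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    order = np.argsort(probs)[::-1]          # token ids, most probable first
    if top_k > 0:
        probs[order[top_k:]] = 0.0           # keep only the k most likely tokens
    if top_p < 1.0:
        cumulative = np.cumsum(probs[order])
        cutoff = np.searchsorted(cumulative, top_p) + 1
        probs[order[cutoff:]] = 0.0          # keep the smallest nucleus covering top_p
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

With `top_k=1` this degenerates to greedy decoding; raising `temperature` flattens the distribution and makes rarer tokens more likely.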

Day 15: Advanced Tokenizer

Intermediate

Deep dive into encode/decode, special tokens, padding/truncation strategies, and chat templates with apply_chat_template().
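
What a chat template produces can be shown with a hand-rolled ChatML-style formatter. This is an illustrative stand-in; in practice the template comes from the model's own tokenizer via `apply_chat_template()`:

```python
def apply_simple_chat_template(messages):
    # Wrap each turn in ChatML-style special tokens, then open an
    # assistant turn so the model knows to generate the reply.
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")   # generation prompt
    return "\n".join(parts)

chat = apply_simple_chat_template([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is a token?"},
])
```

Different model families use different special tokens, which is exactly why you should rely on the tokenizer's bundled template rather than hard-coding one.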

Day 16: Prompt Engineering

Intermediate

Learn Zero-shot, Few-shot, Chain-of-Thought techniques, structured output (JSON), and 10 practical prompt tips.
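
The few-shot pattern from this day is just careful string assembly: task description, worked examples, then the query. A minimal sketch (the helper name and example texts are hypothetical):

```python
def few_shot_prompt(task, examples, query):
    # Build a few-shot prompt: instructions, labeled examples, open query.
    lines = [task, ""]
    for text, label in examples:
        lines += [f"Input: {text}", f"Output: {label}", ""]
    lines += [f"Input: {query}", "Output:"]
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Classify the sentiment of each input as positive or negative.",
    [("I loved this movie!", "positive"), ("Terrible service.", "negative")],
    "The food was amazing.",
)
```

Ending the prompt with a bare `Output:` nudges the model to complete the pattern, which is what makes few-shot prompting work.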

Day 17: Building a RAG Pipeline

Advanced

Build a complete RAG pipeline from document loading to Retrieval-Augmented Generation using LangChain and ChromaDB.
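
The retrieve-then-generate idea behind the pipeline can be shown dependency-free. This sketch uses a trigram-hashing stand-in for a real embedding model and a plain list instead of ChromaDB; the actual day uses LangChain and ChromaDB:

```python
import numpy as np

def embed(text, dim=256):
    # Stand-in embedding: hash character trigrams into a fixed-size vector.
    # (A real pipeline would call an embedding model instead.)
    v = np.zeros(dim)
    for i in range(len(text) - 2):
        v[hash(text[i:i + 3]) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

def retrieve(query, documents, k=2):
    # Rank documents by cosine similarity to the query embedding.
    q = embed(query)
    scored = sorted(documents, key=lambda d: float(embed(d) @ q), reverse=True)
    return scored[:k]

docs = [
    "ChromaDB stores embeddings for similarity search.",
    "LoRA reduces trainable parameters during fine-tuning.",
    "LangChain chains retrievers and LLMs into pipelines.",
]
question = "How do I search embeddings by similarity?"
context = retrieve(question, docs, k=1)
prompt = f"Answer using this context:\n{context[0]}\n\nQuestion: {question}"
```

The final prompt stuffs the retrieved passage in front of the question, which is the whole trick: generation stays grounded in retrieved text rather than the model's parametric memory.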

Day 18: Fine-Tuning Concepts and Strategies

Intermediate

Understand the difference between Full Fine-Tuning and Parameter-Efficient Fine-Tuning (PEFT), and learn decision criteria for fine-tuning vs. prompting.

Day 19: Understanding LoRA and QLoRA

Advanced

Learn the principles of LoRA (Low-Rank Adaptation), rank/alpha parameters, and QLoRA's 4-bit quantization approach through code.
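
The LoRA update can be written out directly: the frozen weight W is augmented by a low-rank product scaled by alpha/r. A NumPy sketch with hypothetical layer sizes:

```python
import numpy as np

d, r, alpha = 512, 8, 16            # example layer width and LoRA hyperparameters
rng = np.random.default_rng(0)

W = rng.normal(size=(d, d))         # frozen pretrained weight
A = rng.normal(size=(r, d)) * 0.01  # trainable low-rank factor (r x d)
B = np.zeros((d, r))                # B starts at zero, so the adapter begins as a no-op

def lora_forward(x):
    # y = x W^T + (alpha / r) * x A^T B^T  -- only A and B receive gradients.
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

x = rng.normal(size=(2, d))
y = lora_forward(x)

full_params = d * d                 # 262,144 weights in the dense layer
lora_params = 2 * d * r             # 8,192 weights in the adapter
```

Because B is initialized to zero, training starts from the pretrained model's behavior exactly, and the adapter learns only a low-rank correction; here it is 32x smaller than the full weight matrix.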

Day 20: PEFT Library in Practice

Advanced

Hands-on practice with LoraConfig setup, target module selection, trainable parameter verification, and model saving/loading using the PEFT library.

Day 21: Preparing Fine-Tuning Datasets

Intermediate

Learn Alpaca and ShareGPT dataset formats, data collection/cleaning/validation methods, and datasets library usage.
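
The two dataset formats from this day look like this as Python records (the field names are the real conventions; the example texts are made up for illustration):

```python
# One record in the Alpaca instruction format: instruction / input / output.
alpaca_example = {
    "instruction": "Summarize the following text.",
    "input": "LoRA adds low-rank adapters to a frozen model.",
    "output": "LoRA fine-tunes small adapter matrices instead of all weights.",
}

# The same exchange in ShareGPT format: a list of role-tagged conversation turns.
sharegpt_example = {
    "conversations": [
        {"from": "human",
         "value": "Summarize: LoRA adds low-rank adapters to a frozen model."},
        {"from": "gpt",
         "value": "LoRA fine-tunes small adapter matrices instead of all weights."},
    ]
}
```

Alpaca suits single-turn instruction data, while ShareGPT preserves multi-turn structure; most training frameworks accept either and convert to the model's chat template internally.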

Day 22: SFT (Supervised Fine-Tuning) in Practice

Advanced

Run actual training with the trl library's SFTTrainer, and cover wandb monitoring and checkpoint management.

Day 23: Quantization Guide

Advanced

Compare precision levels from FP32 to INT4, explore GGUF/AWQ/GPTQ formats, and learn quantization methods with bitsandbytes and llama.cpp.
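
The basic mechanism behind these formats can be shown with absmax INT8 quantization in NumPy. This is a simplified sketch; bitsandbytes and GPTQ/AWQ use more sophisticated per-block and error-aware variants:

```python
import numpy as np

def quantize_int8(w):
    # Absmax quantization: map [-max|w|, max|w|] onto the INT8 range [-127, 127].
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate floats by rescaling the integers.
    return q.astype(np.float32) * scale

w = np.random.default_rng(0).normal(size=1000).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
error = float(np.abs(w - w_hat).max())
```

Storage drops from 4 bytes to 1 byte per weight, and the worst-case rounding error is half a quantization step (scale / 2); INT4 halves storage again at the cost of a coarser grid.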

Day 24: Local Model Serving

Intermediate

Serve LLMs locally with Ollama, vLLM, Text Generation WebUI, and llama.cpp, and compare their performance.

Day 25: Mini Project -- Build Your Own AI Assistant

Advanced

Integrate everything learned throughout the course: from model selection to fine-tuning, RAG, serving, and Gradio UI.